Data Engineer - Databricks | USA or Colombia

Plano, TX
Full Time
Experienced

Who We Are
MAS means "more" in Spanish, and MAS Global’s name reflects our mission to create educational and career opportunities for women and Latinos in tech—just like the ANDI-EAFIT scholarship that helped our Latina Founder rise from a low-income neighborhood in Medellín to being named a Top 100 Hispanic in Tech in the U.S.

Headquartered in Tampa, Florida, with a strong presence in Colombia, Argentina, and across LATAM, we offer North American clients onshore and nearshore access to top tech talent. From full-stack engineering and cloud modernization to data and AI solutions, our digital pods deliver real impact. With a diverse team representing 10+ nationalities and a track record of results for clients like Johnson Controls, JPMorgan Chase, and Dell, our future is bright.

MAS Global is a 100% Hispanic- and women-owned company, recognized as a Great Place to Work and one of the fastest-growing companies in the U.S.

About This Opportunity

Location: Onsite in Medellín, Colombia, or in the USA: New York City; New Jersey; Columbus, Ohio; Wilmington, Delaware; or Plano, Texas

We are seeking a highly skilled Data Engineer to join an agile, cross-functional team working for a leading global financial institution. This organization provides innovative banking, investment, and financial services to millions of consumers and businesses worldwide, and is known for its focus on digital transformation and cutting-edge technology.

In this role, you will be responsible for designing and building robust data pipelines using Databricks and Python, optimizing large-scale data processing workflows, and enabling data-driven decision-making in a high-impact environment. You will collaborate closely with software engineers, analysts, and data scientists to ensure data reliability, scalability, and performance.

As part of a collaborative and agile team, you'll have the opportunity to work on complex challenges involving cloud infrastructure, real-time data flows, and enterprise-scale data governance — all within a culture that values innovation, continuous improvement, and high standards.

Note: This opportunity does NOT offer visa sponsorship; we will only consider candidates who are legally authorized to work in the USA.


Who You Are
You are a skilled Data Engineer with a strong foundation in database management, ETL pipelines, and performance tuning. You are analytical, detail-oriented, and passionate about creating reliable data infrastructure to support business intelligence and decision-making.

You Have:

  • Education & Experience
    • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (a Master’s degree is a plus).

    • 5+ years of experience in data engineering or a related role.

    • 3+ years of hands-on experience working with Databricks in production environments.

    • Strong experience in Python for data processing and pipeline development.

  • Technical Skills
    • Databricks:

      • Proficient in using Databricks for building and managing data pipelines.

      • Experience with Delta Lake, Unity Catalog, and Databricks Notebooks.

      • Familiarity with Databricks workflows, jobs, and cluster management.

    • Python:

      • Strong coding skills for data manipulation, ETL/ELT jobs, and automation.

      • Experience using libraries such as Pandas, PySpark, and NumPy.

    • Big Data & Cloud:

      • Experience working with cloud platforms (AWS, Azure, or GCP).

      • Familiarity with storage solutions like S3, ADLS, or GCS.

      • Strong knowledge of distributed data processing (Spark, PySpark).

    • Data Modeling & ETL/ELT:

      • Proficiency in designing and implementing data models and pipelines.

      • Experience in developing scalable ETL processes.

      • Knowledge of data warehouse concepts (Star/Snowflake schemas).

    • SQL & Databases:

      • Advanced SQL skills for data transformation and analysis.

      • Experience with relational (PostgreSQL, MySQL) and NoSQL databases.

    • CI/CD & Version Control:

      • Experience with Git and DevOps practices.

      • Familiarity with CI/CD tools and infrastructure-as-code for data deployments.

  • Soft Skills
    • Strong analytical and problem-solving abilities.

    • Excellent communication and collaboration skills in cross-functional teams.

    • Ability to work in a fast-paced, agile environment with evolving priorities.

  • Preferred Qualifications
    • Databricks or Azure/AWS certifications (e.g., Databricks Certified Data Engineer Associate/Professional).

    • Experience with ML pipelines or working alongside data scientists.

    • Familiarity with orchestration tools (e.g., Airflow, Azure Data Factory, Prefect).

    • Experience in data governance, data quality, and metadata management.


Our Values
At MAS, we believe in the power of “MORE.” This belief drives our commitment to excellence, accountability, collaboration, and growth:

  • Client Value Obsession – We deliver the highest levels of excellence by proactively addressing client needs.

  • Driven by Action & Accountability – We uphold the highest standards of reliability and follow-through.

  • One Team, One Voice – We foster inclusivity, value diverse perspectives, and support one another as a unified team.

  • Growth Mindset – We thrive on change and innovation through continuous learning and personal development.


Why MAS Global?
Join a team where your voice matters. At MAS Global, people from diverse backgrounds come together to make an impact and bring their authentic selves to work. As a woman- and minority-owned business, we are proud to be an equal opportunity employer. We embrace inclusivity and accessibility and provide reasonable accommodations for applicants with disabilities.
